Consistency and Generalization in Incrementally Trained Connectionist Networks

Author

  • Tony Martinez
Abstract

This paper discusses aspects of consistency and generalization in connectionist networks that learn through incremental training by examples or rules. Differences between training-set learning and incremental rule or example learning are presented. Generalization, the ability to output reasonable mappings when presented with novel input patterns, is discussed in light of the above learning methods. In particular, the contrast between Hamming-distance generalization and generalization by high-order combinations of critical variables is outlined. Examples of detailed rules for an incremental learning model are presented for both consistency and generalization constraints.
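The Hamming-distance generalization mentioned in the abstract can be illustrated with a minimal sketch (not from the paper itself): a novel binary input is mapped to the output of the stored training pattern that differs from it in the fewest positions. The training set below is hypothetical.

```python
def hamming(a, b):
    # Number of positions where two equal-length binary patterns differ.
    return sum(x != y for x, y in zip(a, b))

def generalize(train, novel):
    # Hamming-distance generalization: respond to a novel input with the
    # output of the nearest stored (input, output) training pair.
    pattern, output = min(train, key=lambda pair: hamming(pair[0], novel))
    return output

# Hypothetical training set of (input pattern, output) pairs.
train = [((0, 0, 1), 'A'), ((1, 1, 0), 'B'), ((1, 1, 1), 'B')]
print(generalize(train, (0, 0, 0)))  # nearest pattern is (0, 0, 1) -> 'A'
```

Generalization by high-order combinations of critical variables would instead key the response on specific conjunctions of input variables rather than on overall pattern distance.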


Similar resources

Computational modeling of dynamic decision making using connectionist networks

This research presents a connectionist model of decision making. Important brain areas for decision making are the thalamus, the prefrontal cortex, and the amygdala. A connectionist model with three parts representing these three areas is built based on results from the Iowa Gambling Task, which is widely used to study emotional decision making. In these kind of decisio...


Fractal Analysis Illuminates the Form of Connectionist Structural Gradualness

We examine two connectionist networks, a fractal learning neural network (FLNN) and a Simple Recurrent Network (SRN), that are trained to process center-embedded symbol sequences. Previous work provides evidence that connectionist networks trained on infinite-state languages tend to form fractal encodings. Most such work focuses on simple counting-recursion cases (e.g., a^n b^n), which are not compa...


Monitoring of Regional Low-Flow Frequency Using Artificial Neural Networks

Much of the country lies in arid and semiarid regions whose sensitive and fragile ecosystems are easily degraded. In this paper, artificial neural networks (ANNs) are introduced to obtain improved regional low-flow estimates at ungauged sites. A multilayer perceptron (MLP) network is used to identify the funct...


Modularity and Scaling in Large Phonemic Neural Networks

Scaling connectionist models to larger systems is difficult because larger networks require increasing amounts of training time and data, and the complexity of the optimization task quickly reaches computationally unmanageable proportions. In this paper, we train several small Time-Delay Neural Networks aimed at all phonemic subcategories (nasals, fricatives, etc.) and report exce...


Birth of an Abstraction

Human participants and recurrent (“connectionist”) neural networks were both trained on a categorization system abstractly similar to natural language systems involving irregular (“strong”) classes and a default class. Both the humans and the networks exhibited staged learning and a generalization pattern reminiscent of the Elsewhere Condition (Kiparsky, 1973). Previous connectionist accounts o...




Publication date: 1990